On probabilistic analog automata
We consider probabilistic automata on a general state space and study their computational power. The model builds on Rabin's concept of language recognition by probabilistic automata and on the models of analog computation in a noisy environment proposed by Maass and Orponen, and by Maass and Sontag. Our main result is a generalization of Rabin's reduction theorem, implying that under very mild conditions the computational power of the automaton is limited to regular languages.
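As a concrete illustration of the Rabin-style recognition model the abstract builds on, the sketch below implements acceptance by a probabilistic finite automaton with a cut-point. The toy transition matrices, state set, and cut-point value are assumptions for illustration only, not taken from the paper.

```python
import numpy as np

# Rabin-style probabilistic finite automaton (PFA): a word is accepted
# when the probability of ending in an accepting state exceeds a fixed
# cut-point lambda. The automaton below is a toy example.

# Two states, alphabet {'a', 'b'}; each matrix is row-stochastic.
TRANSITIONS = {
    'a': np.array([[0.9, 0.1],
                   [0.2, 0.8]]),
    'b': np.array([[0.5, 0.5],
                   [0.4, 0.6]]),
}
INITIAL = np.array([1.0, 0.0])    # start in state 0 with probability 1
ACCEPTING = np.array([0.0, 1.0])  # state 1 is the only accepting state
CUTPOINT = 0.5

def acceptance_probability(word: str) -> float:
    """Push the state distribution through the word's transition matrices."""
    dist = INITIAL
    for symbol in word:
        dist = dist @ TRANSITIONS[symbol]
    return float(dist @ ACCEPTING)

def accepts(word: str) -> bool:
    return acceptance_probability(word) > CUTPOINT
```

The reduction theorem mentioned above concerns when such a machine, despite its continuum of acceptance probabilities, still recognizes only a regular language.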
Using the Gibbs function as a measure of human brain development trends from fetal stage to advanced age
We propose to use a Gibbs free energy function as a measure of human brain development, applying it across the human lifespan: from the prenatal stage to advanced age. We used proteomic expression data together with the Gibbs free energy to quantify the human brain's protein–protein interaction networks. The data, obtained from BioGRID, comprised tissue samples from the 16 main brain areas, at different ages, of 57 post-mortem human brains. We found a consistent functional dependence of the Gibbs free energies on age for most of the areas and for both sexes. A significant upward trend in the Gibbs function was found during the fetal stages, followed by a sharp drop at birth, a subsequent period of relative stability, and a final upward trend toward advanced age. We interpret these data in terms of structure formation followed by its stabilization and eventual deterioration. Furthermore, analysis of the data by sex uncovered functional differences: male Gibbs function values are lower than female values at prenatal and neonatal ages, become higher at ages 8 to 40, and finally converge with the female values in late adulthood.
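To make the idea of scoring a protein–protein interaction network with a Gibbs-like energy concrete, here is a minimal sketch. The functional form (each protein's normalized expression weighted by the log of its share of its neighbourhood's expression mass) and the toy network and values are assumptions for illustration; they are not taken from the paper or from BioGRID.

```python
import math

# Toy PPI network: protein -> set of interaction partners (symmetric).
NETWORK = {
    'A': {'B', 'C'},
    'B': {'A', 'C'},
    'C': {'A', 'B', 'D'},
    'D': {'C'},
}

# Normalized expression values (in practice, from proteomic data).
EXPRESSION = {'A': 0.4, 'B': 0.3, 'C': 0.2, 'D': 0.1}

def gibbs_energy(network, expression):
    """Sum over proteins of c_i * ln(c_i / neighbourhood mass).

    Each ratio is < 1, so every term (and the total) is negative;
    denser, more tightly coupled networks score lower.
    """
    total = 0.0
    for protein, neighbours in network.items():
        c_i = expression[protein]
        neighbourhood = c_i + sum(expression[n] for n in neighbours)
        total += c_i * math.log(c_i / neighbourhood)
    return total
```

Under this kind of measure, the trends the abstract describes would appear as the scalar `gibbs_energy` value rising and falling with age.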
Towards Formal Verification of Computations and Hypercomputations in Relativistic Physics
It is now more than 15 years since Copeland and Proudfoot introduced the term hypercomputation. Although no hypercomputer has yet been built (and perhaps never will be), it is instructive to consider what properties any such device should possess, and whether these requirements could ever be met. Aside from the potential benefits that would accrue from a positive outcome, the issues raised are sufficiently disruptive that they force us to re-evaluate existing computability theory. From a foundational viewpoint, the questions driving hypercomputation theory remain the same as those addressed since the earliest days of computer science, viz. "What is computation?" and "What can be computed?" Early theoreticians developed models of computation that are independent of both their implementation and their physical location, but it has become clear in recent decades that these aspects of computation cannot always be neglected. In particular, the computational power of a distributed system can be expected to vary according to the spacetime geometry in which the machines on which it is running are located. The power of a computing system therefore depends on its physical environment and cannot be specified in absolute terms. Even Turing machines are capable of super-Turing behaviour, given the right environment.
Squamous cell carcinoma of the breast: a case report
Background: Squamous cells are normally not found inside the breast, so a primary squamous cell carcinoma of the breast is an exceptional phenomenon. There is a possible explanation for these findings. Case presentation: A 72-year-old woman presented with a breast abnormality suspicious for breast carcinoma. After the operation, pathological examination revealed a primary squamous cell carcinoma of the breast. Conclusion: The presentation of a squamous cell carcinoma can be similar to that of an adenocarcinoma. However, a squamous cell carcinoma of the breast can also develop from a complicated breast cyst or abscess. Therefore, pathological examination of these apparently benign abnormalities is mandatory.
A Survey on Continuous Time Computations
We provide an overview of theories of continuous time computation. These
theories allow us to understand both the hardness of questions related to
continuous time dynamical systems and the computational power of continuous
time analog models. We survey the existing models, summarize known results, and point to relevant references in the literature.
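A minimal example (not taken from the survey) of what a continuous-time analog computation looks like in practice: the ODE y' = y with y(0) = 1 "computes" exp(t), and a digital machine can only approximate its trajectory by discretizing time. The explicit Euler scheme and step count below are illustrative assumptions.

```python
def euler_exp(t: float, steps: int = 100_000) -> float:
    """Approximate exp(t) by Euler-integrating y' = y from y(0) = 1."""
    y = 1.0
    h = t / steps
    for _ in range(steps):
        y += h * y  # discrete step of the continuous-time dynamics
    return y
```

The gap between the exact continuous trajectory and any such discretization is one source of the hardness questions the survey discusses.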
On the Bounds of Function Approximations
Within machine learning, the subfield of Neural Architecture Search (NAS) has
recently garnered research attention due to its ability to improve upon
human-designed models. However, the computational requirements for finding an
exact solution to this problem are often intractable, and the design of the
search space still requires manual intervention. In this paper we attempt to
establish a formalized framework from which we can better understand the
computational bounds of NAS in relation to its search space. For this, we first
reformulate the function approximation problem in terms of sequences of
functions, and we call it the Function Approximation (FA) problem; then we show
that it is computationally infeasible to devise a procedure that solves FA for
all functions to zero error, regardless of the search space. We also show that this error is minimal if a specific class of functions is present in the search space. Subsequently, we show that machine learning, viewed as a mathematical problem, is a solution strategy for FA, albeit not an effective one, and we describe a stronger version of this approach: the Approximate Architectural Search Problem (a-ASP), the mathematical equivalent of NAS. We leverage the framework from this paper and results from the literature to describe the conditions under which a-ASP can potentially solve FA as well as an exhaustive search, but in polynomial time.

Comment: Accepted as a full paper at ICANN 2019. The final, authenticated publication will be available at https://doi.org/10.1007/978-3-030-30487-4_3
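The role of the search space in the abstract above can be illustrated with a toy exhaustive search: score each candidate function against the target, and note that zero error is reachable only when the target's class is actually in the space. The candidate set, error metric, and sample points below are assumptions for illustration, not the paper's formalism.

```python
def search(candidates, target, points):
    """Return (best_name, best_error) under max absolute error on points."""
    best_name, best_err = None, float('inf')
    for name, fn in candidates.items():
        err = max(abs(fn(x) - target(x)) for x in points)
        if err < best_err:
            best_name, best_err = name, err
    return best_name, best_err

# A tiny search space of candidate functions.
CANDIDATES = {
    'identity': lambda x: x,
    'square':   lambda x: x * x,
    'constant': lambda x: 1.0,
}
POINTS = [0.0, 0.5, 1.0, 2.0]

# The target x^2 is in the space, so the search reaches zero error;
# drop 'square' from CANDIDATES and the best error becomes strictly positive.
name, err = search(CANDIDATES, lambda x: x * x, POINTS)
```

The paper's infeasibility result says, roughly, that no procedure achieves `err == 0` for all targets at once, whatever the search space; NAS corresponds to searching such a space without enumerating it exhaustively.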